
    Bayesian subset simulation

    We consider the problem of estimating a probability of failure $\alpha$, defined as the volume of the excursion set of a function $f: \mathbb{X} \subseteq \mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a given probability measure on $\mathbb{X}$. In this article, we combine the popular subset simulation algorithm (Au and Beck, Probab. Eng. Mech., 2001) and our sequential Bayesian approach for the estimation of a probability of failure (Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput., 2012). This makes it possible to estimate $\alpha$ when the number of evaluations of $f$ is very limited and $\alpha$ is very small. The resulting algorithm is called Bayesian subset simulation (BSS). A key idea, as in the subset simulation algorithm, is to estimate the probabilities of a sequence of excursion sets of $f$ above intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A Gaussian process prior on $f$ is used to define the sequence of densities targeted by the SMC algorithm, and to drive the selection of evaluation points of $f$ for estimating the intermediate probabilities. Adaptive procedures are proposed to determine the intermediate thresholds and the number of evaluations to be carried out at each stage of the algorithm. Numerical experiments illustrate that BSS achieves significant savings in the number of function evaluations with respect to other Monte Carlo approaches.
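    As a point of reference for the decomposition that both subset simulation and BSS rely on (the failure probability is written as a product of larger conditional probabilities across intermediate thresholds), here is a minimal NumPy sketch of plain, non-Bayesian subset simulation on a toy problem. The limit-state function, threshold and level fraction p0 are arbitrary illustrative choices, and the Metropolis move is a crude stand-in for the modified Metropolis sampler of Au and Beck.

        import numpy as np

        rng = np.random.default_rng(0)

        def f(x):
            # Toy limit-state function; "failure" is the rare event f(x) >= u.
            return x.sum(axis=-1)

        d, u, p0, n = 2, 5.0, 0.1, 2000  # dimension, threshold, level fraction, sample size

        x = rng.standard_normal((n, d))  # i.i.d. sample from the input distribution
        y = f(x)
        alpha_hat, level = 1.0, -np.inf

        while level < u:
            # Intermediate threshold: empirical (1 - p0)-quantile, capped at u.
            level = min(np.quantile(y, 1.0 - p0), u)
            mask = y >= level
            alpha_hat *= mask.mean()  # conditional probability of reaching this level
            if level >= u:
                break
            # Repopulate the current excursion set with a crude Metropolis move.
            x = x[mask][rng.integers(mask.sum(), size=n)]
            prop = x + 0.5 * rng.standard_normal((n, d))
            accept = np.log(rng.random(n)) < 0.5 * ((x**2).sum(-1) - (prop**2).sum(-1))
            accept &= f(prop) >= level
            x = np.where(accept[:, None], prop, x)
            y = f(x)

        print(f"estimated failure probability: {alpha_hat:.2e}")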

    A Bayesian approach to constrained single- and multi-objective optimization

    This article addresses the problem of derivative-free (single- or multi-objective) optimization subject to multiple inequality constraints. Both the objective and the constraint functions are assumed to be smooth, non-linear and expensive to evaluate. As a consequence, the number of evaluations that can be used to carry out the optimization is very limited, as in complex industrial design optimization problems. The method we propose to overcome this difficulty has its roots in both the Bayesian and the multi-objective optimization literatures. More specifically, an extended domination rule is used to handle objectives and constraints in a unified way, and a corresponding expected hypervolume improvement sampling criterion is proposed. This new criterion is naturally adapted to the search for a feasible point when none is available, and reduces to existing Bayesian sampling criteria (the classical Expected Improvement (EI) criterion and some of its constrained/multi-objective extensions) as soon as at least one feasible point is available. The calculation and the optimization of the criterion are performed using Sequential Monte Carlo techniques. In particular, an algorithm similar to the subset simulation method, which is well known in the field of structural reliability, is used to estimate the criterion. The method, which we call BMOO (for Bayesian Multi-Objective Optimization), is compared to state-of-the-art algorithms for single- and multi-objective constrained optimization.
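    To make the extended domination rule concrete, here is one simplified Python reading of it (minimization, with feasibility meaning c <= 0): feasible points are compared on their objectives, infeasible points on their constraint violations, and any feasible point dominates any infeasible one. This is a sketch of the idea, not the article's exact construction.

        import numpy as np

        def extended_values(y, c):
            # Map objectives y and constraints c to the extended space:
            # feasible rows keep (y, 0); infeasible rows get (inf, violation).
            y, c = np.atleast_2d(np.asarray(y, float)), np.atleast_2d(np.asarray(c, float))
            viol = np.maximum(c, 0.0)
            feasible = viol.sum(axis=1) == 0
            return np.where(feasible[:, None],
                            np.hstack([y, np.zeros_like(viol)]),
                            np.hstack([np.full_like(y, np.inf), viol]))

        def dominates(a, b):
            # Pareto domination (componentwise <=, strict somewhere) in the extended space.
            return bool(np.all(a <= b) and np.any(a < b))

        # A feasible point dominates any infeasible one:
        ext = extended_values([[1.0, 2.0], [0.5, 0.5]], [[-1.0, -1.0], [0.3, 0.0]])
        print(dominates(ext[0], ext[1]))  # True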

    Bayesian Subset Simulation: a kriging-based subset simulation algorithm for the estimation of small probabilities of failure

    The estimation of small probabilities of failure from computer simulations is a classical problem in engineering, and the Subset Simulation algorithm proposed by Au & Beck (Prob. Eng. Mech., 2001) has become one of the most popular methods to solve it. Subset Simulation has been shown to provide significant savings in the number of simulations needed to achieve a given estimation accuracy, with respect to many other Monte Carlo approaches. The number of simulations remains quite high, however, and the method can be impractical for applications where an expensive-to-evaluate computer model is involved. We propose a new algorithm, called Bayesian Subset Simulation, that takes the best from the Subset Simulation algorithm and from sequential Bayesian methods based on kriging (also known as Gaussian process modeling). The performance of this new algorithm is illustrated on a test case from the literature, with promising results. In addition, we provide a numerical study of the statistical properties of the estimator. (Presented at the 11th International Probabilistic Safety Assessment and Management Conference (PSAM11) and the Annual European Safety and Reliability Conference (ESREL 2012), Helsinki, Finland, 2012.)
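    The kriging ingredient can be summarized in a few lines: under a Gaussian process posterior with mean $m(x)$ and standard deviation $s(x)$, the posterior probability that $f(x)$ exceeds a threshold $u$ is $\Phi((m(x) - u)/s(x))$. A minimal sketch, with an arbitrary squared-exponential kernel and toy data, neither of which is taken from the article:

        import numpy as np
        from scipy.stats import norm

        def gp_posterior(X, y, Xs, ell=0.3, sf2=1.0, nugget=1e-8):
            # Posterior mean/sd of a zero-mean GP with a squared-exponential
            # kernel, conditioned on noiseless 1-d evaluations (X, y).
            k = lambda A, B: sf2 * np.exp(-0.5 * ((A[:, None] - B[None, :]) / ell) ** 2)
            K = k(X, X) + nugget * np.eye(len(X))
            Ks = k(Xs, X)
            m = Ks @ np.linalg.solve(K, y)
            v = sf2 - np.einsum('ij,ij->i', Ks, np.linalg.solve(K, Ks.T).T)
            return m, np.sqrt(np.maximum(v, 0.0))

        X = np.array([0.0, 0.3, 0.6, 1.0]); y = np.sin(6 * X)
        Xs = np.linspace(0.0, 1.0, 5); u = 0.5
        m, s = gp_posterior(X, y, Xs)
        print(norm.cdf((m - u) / (s + 1e-12)))  # pointwise P(f(x) > u | data)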

    Quantifying uncertainties on excursion sets under a Gaussian random field prior

    We focus on the problem of estimating, and quantifying uncertainties on, the excursion set of a function under a limited evaluation budget. We adopt a Bayesian approach in which the objective function is assumed to be a realization of a Gaussian random field. In this setting, the posterior distribution on the objective function gives rise to a posterior distribution on excursion sets. Several approaches exist to summarize the distribution of such sets based on random closed set theory. While the recently proposed Vorob'ev approach exploits analytical formulae, further notions of variability require Monte Carlo estimators relying on conditional simulations of the Gaussian random field. In the present work, we propose a method to choose Monte Carlo simulation points and obtain quasi-realizations of the conditional field at fine designs through affine predictors. The points are chosen optimally in the sense that they minimize the posterior expected distance in measure between the excursion set and its reconstruction. The proposed method reduces the computational cost of the Monte Carlo simulations and enables the computation of quasi-realizations on fine designs in large dimensions. We apply this reconstruction approach to obtain realizations of an excursion set on a fine grid, which allow us to define a new measure of uncertainty based on the distance transform of the excursion set. Finally, we present a safety engineering test case in which the simulation method is employed to compute a Monte Carlo estimate of a contour line.
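    As an illustration of the Vorob'ev approach mentioned above: given the coverage function $p(x) = P(x \in \text{excursion set})$ evaluated on a grid, the Vorob'ev expectation is the level set $\{p \ge \rho\}$, with $\rho$ chosen so that the volume of the set matches the expected volume of the random set. A minimal sketch by bisection (on a discrete grid the match is only approximate); the probabilities below are made up:

        import numpy as np

        def vorobev_expectation(p, tol=1e-6):
            # Tune the level rho so that vol({p >= rho}) matches E[vol(excursion set)],
            # the latter estimated by the grid average of the coverage probabilities.
            target = p.mean()
            lo, hi = 0.0, 1.0
            while hi - lo > tol:
                rho = 0.5 * (lo + hi)
                if (p >= rho).mean() > target:
                    lo = rho  # set too large -> raise the level
                else:
                    hi = rho
            return p >= rho

        p = np.array([0.05, 0.2, 0.9, 0.95, 0.6, 0.1])
        print(vorobev_expectation(p))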

    Sequential search based on kriging: convergence analysis of some algorithms

    Let $\mathcal{F}$ be a set of real-valued functions on a set $\mathbb{X}$ and let $S: \mathcal{F} \to \mathcal{G}$ be an arbitrary mapping. We consider the problem of making inference about $S(f)$, with $f \in \mathcal{F}$ unknown, from a finite set of pointwise evaluations of $f$. We are mainly interested in the problems of approximation and optimization. In this article, we give a brief review of results concerning average error bounds of Bayesian search methods that use a random process prior on $f$.
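    For instance, the classical kriging-based criterion behind many such sequential search methods is the expected improvement; with posterior mean $m_n$, posterior standard deviation $s_n$ and current best observation $M_n = \max_{i \le n} f(x_i)$, it has the closed form (notation ours, not necessarily the article's):
    \[
    \mathrm{EI}_n(x) = \mathbb{E}_n\bigl[(f(x) - M_n)_+\bigr]
    = (m_n(x) - M_n)\,\Phi(z) + s_n(x)\,\varphi(z),
    \qquad z = \frac{m_n(x) - M_n}{s_n(x)},
    \]
    where $\Phi$ and $\varphi$ denote the standard normal cdf and pdf.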

    Optimisation bayésienne par méthodes SMC (Bayesian optimization using SMC methods)

    The problem considered is the optimization of a real-valued function $f$ using a Bayesian approach. The evaluations of $f$ are chosen sequentially, based on prior information about $f$, which is modeled by a random process, and on the previous evaluations. This approach raises two difficulties, namely estimating the posterior distributions of the parameters involved in the choice of the evaluation points, and maximizing the criterion used to determine this choice. In this article, we propose a Sequential Monte Carlo (SMC) approach to solve both problems simultaneously.
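    A schematic Python sketch of the SMC idea described above: maintain a particle population on the search domain, reweight it by the current sampling criterion, then resample and move. Here the criterion is a fixed toy function standing in for, e.g., expected improvement under the current random-process model, and all tuning constants are arbitrary.

        import numpy as np

        rng = np.random.default_rng(1)

        def smc_step(particles, weights, criterion):
            # Reweight by the criterion, resample, then jitter with a Gaussian kernel.
            w = weights * criterion(particles)
            w /= w.sum()
            idx = rng.choice(len(particles), size=len(particles), p=w)
            moved = particles[idx] + 0.05 * rng.standard_normal(particles.shape)
            return moved, np.full(len(particles), 1.0 / len(particles))

        crit = lambda x: np.exp(-20.0 * (x[:, 0] - 0.7) ** 2)  # toy criterion
        x = rng.random((200, 1)); w = np.full(200, 1.0 / 200)
        for _ in range(10):
            x, w = smc_step(x, w, crit)
        print(float(x.mean()))  # the population concentrates near 0.7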

    Gaussian process modeling for stochastic multi-fidelity simulators, with application to fire safety

    To assess the possibility of evacuating a building in case of fire, a standard approach is to simulate the propagation of the fire using finite difference methods. The simulation takes into account the random behavior of the fire, so that the result of a run is non-deterministic. The mesh fineness controls both the quality of the numerical model and its computational cost: depending on the mesh, one simulation can last anywhere from a few minutes to several weeks. In this article, we focus on predicting the behavior of the fire simulator at fine meshes using cheaper results obtained at coarser meshes. In the literature on the design and analysis of computer experiments, such a problem is referred to as multi-fidelity prediction. Our contribution is to extend the Bayesian multi-fidelity models proposed by Picheny and Ginsbourger (2013) and Tuo et al. (2014) to the case of stochastic simulators.
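    One common way to formalize multi-fidelity prediction, shown here only as a hedged illustration, is an autoregressive structure of the form fine(x) = rho * coarse(x) + delta(x), plus a noise term for a stochastic simulator. In the sketch below the discrepancy delta is a plain linear trend fitted by least squares, whereas the models cited above use Gaussian process priors; all functions and constants are toy stand-ins, not the article's model.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy stand-ins: the coarse level is cheap; the fine level is expensive,
        # stochastic, and related to the coarse one by fine = rho*coarse + delta + noise.
        coarse = lambda x: np.sin(6 * x)
        fine = lambda x: 1.2 * coarse(x) + 0.3 * x + 0.05 * rng.standard_normal(x.shape)

        # Fit rho and a linear discrepancy delta(x) = a*x + b from a few paired runs...
        xtr = np.linspace(0.0, 1.0, 20)
        A = np.column_stack([coarse(xtr), xtr, np.ones_like(xtr)])
        rho, a, b = np.linalg.lstsq(A, fine(xtr), rcond=None)[0]

        # ...then predict the expensive fine level from cheap coarse evaluations.
        xnew = np.linspace(0.0, 1.0, 5)
        print(rho * coarse(xnew) + a * xnew + b)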

    Relabeling and Summarizing Posterior Distributions in Signal Decomposition Problems when the Number of Components is Unknown

    This paper addresses the problems of relabeling and summarizing posterior distributions that typically arise, in a Bayesian framework, when dealing with signal decomposition problems with an unknown number of components. Such posterior distributions are defined over a union of subspaces of differing dimensionality and can be sampled from using modern Monte Carlo techniques, for instance the increasingly popular RJ-MCMC method. No generic approach is available, however, to summarize the resulting variable-dimensional samples and extract from them component-specific parameters. We propose a novel approach to this problem, named Variable-dimensional Approximate Posterior for Relabeling and Summarizing (VAPoRS), which consists in approximating the posterior distribution of interest by a "simple" (but still variable-dimensional) parametric distribution. The distance between the two distributions is measured using the Kullback-Leibler divergence, and a stochastic EM-type algorithm, driven by the RJ-MCMC sampler, is proposed to estimate the parameters. Two signal decomposition problems are considered to show the capability of VAPoRS both for relabeling and for summarizing variable-dimensional posterior distributions: the classical problem of detecting and estimating sinusoids in white Gaussian noise on the one hand, and a particle counting problem motivated by the Pierre Auger project in astrophysics on the other hand.
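    The label-switching issue can be seen in a fixed-dimensional miniature: posterior samples of component parameters arrive with their labels randomly permuted, so componentwise summaries are meaningless until the samples are relabeled. The sketch below relabels by assignment-cost minimization against a reference; it illustrates the problem, not the article's KL-based stochastic-EM algorithm, and the reference values are arbitrary.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        def relabel(samples, ref):
            # Permute each sample's component labels to best match the reference,
            # minimizing the squared-distance assignment cost.
            out = np.empty_like(samples)
            for t, s in enumerate(samples):
                cost = (s[:, None] - ref[None, :]) ** 2
                rows, cols = linear_sum_assignment(cost)
                out[t, cols] = s[rows]
            return out

        # Toy data: two-component samples whose labels are randomly switched.
        rng = np.random.default_rng(3)
        raw = np.array([-1.0, 1.0]) + 0.1 * rng.standard_normal((500, 2))
        flip = rng.random(500) < 0.5
        raw[flip] = raw[flip, ::-1]
        fixed = relabel(raw, ref=np.array([-1.0, 1.0]))
        print(raw.mean(axis=0), fixed.mean(axis=0))  # means recovered after relabeling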

    Summarizing Posterior Distributions in Signal Decomposition Problems when the Number of Components is Unknown

    This paper addresses the problem of summarizing the posterior distributions that typically arise, in a Bayesian framework, when dealing with signal decomposition problems with an unknown number of components. Such posterior distributions are defined over a union of subspaces of differing dimensionality and can be sampled from using modern Monte Carlo techniques, for instance the increasingly popular RJ-MCMC method. No generic approach is available, however, to summarize the resulting variable-dimensional samples and extract from them component-specific parameters. We propose a novel approach to this problem, which consists in approximating the complex posterior of interest by a "simple" (but still variable-dimensional) parametric distribution. The distance between the two distributions is measured using the Kullback-Leibler divergence, and a stochastic EM-type algorithm, driven by the RJ-MCMC sampler, is proposed to estimate the parameters. The proposed algorithm is illustrated on the fundamental signal processing example of joint detection and estimation of sinusoids in white Gaussian noise.
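    The fitting step shared by this approach and VAPoRS above can be written compactly: minimizing the Kullback-Leibler divergence from the posterior $\pi$ to the parametric family $q_\theta$ amounts to maximizing the expected log-density under $\pi$, which the RJ-MCMC draws estimate by an empirical average (notation ours):
    \[
    \arg\min_\theta \mathrm{KL}(\pi \,\|\, q_\theta)
    = \arg\max_\theta \mathbb{E}_\pi[\log q_\theta],
    \qquad
    \widehat{\theta} \in \arg\max_\theta \frac{1}{N} \sum_{i=1}^{N} \log q_\theta\bigl(x^{(i)}\bigr),
    \quad x^{(i)} \sim \pi,
    \]
    the stochastic EM step handling the latent component allocations inside this maximization.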